mcp-run-python (official, by pydantic)
test_stop_settings[google].yaml (1.61 kB)
interactions:
- request:
    headers:
      accept:
      - '*/*'
      accept-encoding:
      - gzip, deflate
      connection:
      - keep-alive
      content-length:
      - '216'
      content-type:
      - application/json
      host:
      - generativelanguage.googleapis.com
    method: POST
    parsed_body:
      contents:
      - parts:
        - text: What is the capital of France? Give me an answer that contains the word "Paris", but is not the first word.
        role: user
      generationConfig:
        stopSequences:
        - Paris
    uri: https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent
  response:
    headers:
      alt-svc:
      - h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
      content-length:
      - '677'
      content-type:
      - application/json; charset=UTF-8
      server-timing:
      - gfet4t7; dur=362
      transfer-encoding:
      - chunked
      vary:
      - Origin
      - X-Origin
      - Referer
    parsed_body:
      candidates:
      - avgLogprobs: -0.5998413562774658
        content:
          parts:
          - text: 'The most iconic city in France is '
          role: model
        finishReason: STOP
      modelVersion: gemini-1.5-flash
      responseId: UB5DaMfEN7jFnvgPocrJaA
      usageMetadata:
        candidatesTokenCount: 8
        candidatesTokensDetails:
        - modality: TEXT
          tokenCount: 8
        promptTokenCount: 25
        promptTokensDetails:
        - modality: TEXT
          tokenCount: 25
        totalTokenCount: 33
    status:
      code: 200
      message: OK
version: 1
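This cassette records a single request/response pair for a stop-sequence test: the request sets generationConfig.stopSequences to ["Paris"], and the recorded completion halts just before that word with finishReason STOP. A minimal sketch of how such a request could be issued from pydantic-ai is shown below; the 'google-gla:gemini-1.5-flash' model identifier, the ModelSettings stop_sequences field, and the result attribute name are assumptions based on recent pydantic-ai releases, not taken from this file.

from pydantic_ai import Agent
from pydantic_ai.settings import ModelSettings

# Assumed model identifier; the cassette only shows the generateContent URL
# for gemini-1.5-flash on the Generative Language API.
agent = Agent('google-gla:gemini-1.5-flash')

result = agent.run_sync(
    'What is the capital of France? Give me an answer that contains the '
    'word "Paris", but is not the first word.',
    # stop_sequences maps to generationConfig.stopSequences in the recorded request.
    model_settings=ModelSettings(stop_sequences=['Paris']),
)

# The recorded completion is 'The most iconic city in France is ' with
# finishReason STOP, i.e. generation stopped before "Paris" was emitted.
print(result.output)  # attribute name may differ by version (older releases use result.data)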

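For reference, the recorded exchange is a plain generateContent call. The sketch below replays the same request body with httpx and checks the recorded outcome; the x-goog-api-key header and the GEMINI_API_KEY environment variable are assumptions, since the cassette does not include authentication details.

import os
import httpx

url = (
    'https://generativelanguage.googleapis.com/v1beta/'
    'models/gemini-1.5-flash:generateContent'
)
payload = {
    'contents': [
        {
            'role': 'user',
            'parts': [{
                'text': 'What is the capital of France? Give me an answer '
                        'that contains the word "Paris", but is not the first word.'
            }],
        }
    ],
    'generationConfig': {'stopSequences': ['Paris']},
}

# Auth is an assumption: the cassette omits credentials, but the Generative
# Language API accepts an x-goog-api-key header.
response = httpx.post(
    url,
    json=payload,
    headers={'x-goog-api-key': os.environ['GEMINI_API_KEY']},
    timeout=30,
)
response.raise_for_status()

candidate = response.json()['candidates'][0]
assert candidate['finishReason'] == 'STOP'
# Recorded response text: 'The most iconic city in France is ' (stops before "Paris").
print(candidate['content']['parts'][0]['text'])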